
    Application of Sigma metrics in assessing the clinical performance of verified versus non-verified reagents for routine biochemical analytes

    Introduction: Sigma metrics analysis is considered an objective method to evaluate the performance of a new measurement system. This study was designed to assess the analytical performance of verified versus non-verified reagents for routine biochemical analytes in terms of Sigma metrics. Materials and methods: The coefficient of variation (CV) was calculated from the mean and standard deviation (SD) derived from the internal quality control over 20 consecutive days. The data were measured on an Architect c16000 analyser with reagents from four manufacturers. Commercial reference materials were used to estimate the bias. Total allowable error (TEa) was based on the CLIA 1988 guidelines. Sigma metrics were calculated from the CV, percent bias, and TEa. Normalized method decision charts were built by plotting the normalized bias (bias%/TEa) on the Y-axis and the normalized imprecision (mean CV%/TEa) on the X-axis. Results: The reagents from the different manufacturers were compared in terms of the Sigma metrics for the relevant analytes. The verified reagents from Abbott and Leadman provided better Sigma metrics for the alanine aminotransferase assay than the non-verified reagents (Mindray and Zybio). All reagents performed well for the aspartate aminotransferase and uric acid assays, with a sigma of 5 or higher. Abbott achieved the best performance for the urea assay, with a sigma of 2.83, whereas all other reagents fell below 1 sigma. Conclusion: Sigma metrics analysis is helpful for clarifying the performance of candidate non-verified reagents in the clinical laboratory. Our study suggests that the quality of non-verified reagents should be assessed strictly.
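    The Sigma metric described above combines TEa, bias, and imprecision into a single figure of merit. A minimal sketch, assuming the standard formula Sigma = (TEa% − |bias%|) / CV% and using illustrative CV and bias values that are not taken from the study:

```python
# Illustrative sketch of the Sigma metric calculation; the CV and bias
# values below are made up, not results from the study.

def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma = (TEa% - |bias%|) / CV%, all expressed in percent."""
    return (tea_pct - abs(bias_pct)) / cv_pct

def normalized_point(tea_pct, bias_pct, cv_pct):
    """(x, y) point for a normalized method decision chart:
    x = CV%/TEa (imprecision), y = |bias%|/TEa (inaccuracy)."""
    return cv_pct / tea_pct, abs(bias_pct) / tea_pct

# Example: the CLIA '88 TEa for ALT is 20%; CV and bias are illustrative.
print(sigma_metric(tea_pct=20.0, bias_pct=2.0, cv_pct=3.0))  # 6.0
```

    An assay plotted closer to the origin of the normalized chart (small x and y) achieves a higher sigma for the same TEa.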

    Point Cloud Registration for LiDAR and Photogrammetric Data: a Critical Synthesis and Performance Analysis on Classic and Deep Learning Algorithms

    Recent advances in computer vision and deep learning have shown promising performance in estimating rigid/similarity transformations between unregistered point clouds of complex objects and scenes. However, their performance is mostly evaluated using a limited number of datasets from a single sensor (e.g. Kinect or RealSense cameras), lacking a comprehensive overview of their applicability in photogrammetric 3D mapping scenarios. In this work, we provide a comprehensive review of the state-of-the-art (SOTA) point cloud registration methods, analyzing and evaluating them using a diverse set of point cloud data from indoor to satellite sources. The quantitative analysis allows for exploring the strengths, applicability, challenges, and future trends of these methods. In contrast to existing analysis works that treat point cloud registration as a holistic process, our experimental analysis is based on its inherent two-step process: feature/keypoint-based initial coarse registration, followed by dense fine registration through cloud-to-cloud (C2C) optimization. More than ten methods, including classic hand-crafted, deep-learning-based feature correspondence, and robust C2C methods, were tested. We observed that the success rate of most of the algorithms is below 40% on the datasets we tested, and that there is still a large margin for improvement over existing algorithms concerning 3D sparse correspondence search and the ability to register point clouds with complex geometry and occlusions. With the evaluated statistics on three datasets, we identify the best-performing methods for each step, provide our recommendations, and outline future efforts.
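    Both registration steps described above ultimately reduce to estimating a rigid transformation from matched points. A minimal sketch of that core estimate, using the well-known Kabsch/Umeyama SVD solution (the correspondences are assumed given; this is a generic building block, not any specific method evaluated in the paper):

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) such that dst ~ src @ R.T + t,
    given N matched 3D points per cloud (Kabsch/Umeyama via SVD)."""
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_mean).T @ (dst - dst_mean)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                    # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_mean - R @ src_mean
    return R, t
```

    ICP-style fine registration repeats this estimate, re-matching nearest neighbours between iterations; coarse registration instead feeds it (possibly outlier-contaminated) feature correspondences, usually inside a robust loop such as RANSAC.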

    New Methods in Seismic Reflection Exploration

    One of the main problems in developing the new near-zero-offset method based on the roll-along zero-offset receiver (RAZOR) array is how to simulate this spatially extended array. In this thesis we use Sierra products (standard industry geophysical processing and geological interpretation systems) to carry out numerical experiments on synthetic RAZOR array surveys. Two array design modes are proposed for modelling these surveys. The first is the crooked-line mode, which is suitable for experiments with a single RAZOR array. Its advantage is that all the data traces are sorted directly into the acquisition order of the RAZOR shot gather and are stored in one disk file without any extra processing. The second is the five-line mode, which is better suited to simulating roll-along experiments with the RAZOR array. A new Fortran-77 program, RAZORSORT.F, was developed for post-raytracing sorting. In addition, a new quick-display tool, QPLOT.F, has been developed, supported by the UNIRAS package.
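    The post-raytracing sort performed by RAZORSORT.F can be illustrated in outline; the trace-header fields used here (shot index, receiver offset) are assumptions for illustration, since the abstract does not specify the record format:

```python
# Hypothetical trace records; RAZORSORT.F's actual header layout is not
# specified in this abstract, so the field names are assumptions.

def sort_razor_traces(traces):
    """Order synthetic traces into RAZOR shot-gather acquisition order:
    by shot index first, then by receiver offset within each shot."""
    return sorted(traces, key=lambda tr: (tr["shot"], tr["offset"]))

traces = [
    {"shot": 2, "offset": 50.0},
    {"shot": 1, "offset": 100.0},
    {"shot": 1, "offset": 0.0},
]
ordered = sort_razor_traces(traces)
```

    Once sorted this way, all traces of a gather are contiguous and can be written to a single file, mirroring the advantage the crooked-line mode provides without extra processing.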

    Polymer Nanocomposite Protective Coatings Deposited Using Layer-By-Layer Assembly

    Protective coatings with the ability to shield the materials underneath are crucial to packaging, flame retardancy, and corrosion prevention. Amongst all the desired properties, barrier performance is critical for protective coatings: packaging requires a gas barrier, and corrosion protection becomes more efficient with a good barrier against corrosive species. Polymer-clay composites have shown great potential as protective coatings due to their cost efficiency, ease of production, and good mechanical properties, and, more importantly, the good barrier imparted by the tortuous pathway created by impermeable clay. Despite these benefits, further improvements are limited by clay aggregation and misalignment within polymer matrices. Layer-by-layer (LbL) assembly has proven to be a cost-effective technique that enables high clay loading (> 60 wt%) in thin film coatings. This dissertation is focused on utilizing LbL assembly to achieve a high level of clay alignment and loading in unconventional polymer matrices for varying applications, along with the development of new functionalities. Hydrogen-bonded all-polymer systems are highly stretchable, but they suffer from low barrier. In an effort to improve barrier performance while maintaining stretchability, clay platelets were introduced into a hydrogen-bonded system by the alternate deposition of poly(ethylene oxide) (PEO) and poly(acrylic acid) (PAA) mixed with montmorillonite (MMT) clay. This system, with aligned clay, provides the best stretchable oxygen barrier to date. In addition to MMT, vermiculite (VMT) clay, with its larger aspect ratio, is known to impart a better barrier when incorporated into LbL systems. In an effort to improve the barrier and flame resistance of biodegradable polymers such as cellulose, VMT clay is paired with modified cellulose nanofibrils (CNF), which have positively charged surfaces, using LbL assembly. The resulting nanobrick wall thin film structure imparts great improvement in oxygen barrier, flame resistance, and modulus. LbL-assembled polymer-clay films also demonstrate good corrosion protection. A 30-bilayer waterborne polyurethane and VMT coating, with a thickness of 300 nm, provides a 100X improvement in impedance and remains effective for at least five days. This is a result of relatively high hydrophobicity and the nanobrick wall structure, making it a potential environmentally friendly replacement for toxic chromate conversion coatings (CCCs).
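    The barrier gain from the tortuous pathway around aligned impermeable platelets can be estimated with the classic Nielsen model; the aspect ratio and volume fraction below are illustrative, not values from the dissertation:

```python
def nielsen_relative_permeability(phi, aspect_ratio):
    """Nielsen (1967) tortuous-path model for impermeable platelets
    aligned perpendicular to the diffusion direction:
        P / P0 = (1 - phi) / (1 + (alpha / 2) * phi)
    phi: platelet volume fraction; alpha: aspect ratio (width/thickness)."""
    return (1 - phi) / (1 + (aspect_ratio / 2) * phi)

# Illustrative values only: 30 vol% platelets with aspect ratio 100
rel_p = nielsen_relative_permeability(0.3, 100)
improvement = 1 / rel_p   # roughly a 23x reduction in permeability
```

    The model makes plain why the larger-aspect-ratio VMT outperforms MMT at comparable loading, and why aggregation and misalignment, which reduce the effective aspect ratio, erode the barrier.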